We consider Bayesian inference by importance sampling when the likelihood is analytically intractable but can be unbiasedly estimated. We refer to this procedure as importance sampling squared (IS2), as we can often estimate the likelihood itself by importance sampling. We provide a formal justification for importance sampling when working with an estimate of the likelihood and study its convergence properties. We analyze the effect of estimating the likelihood on the resulting inference and provide guidelines on how to set up the precision of the likelihood estimate in order to obtain an optimal tradeoff between computational cost and accuracy for posterior inference on the model parameters. We illustrate the procedure in empirical applications for a generalized multinomial logit model and a stochastic volatility model. The results show that the IS2 method can lead to fast and accurate posterior inference under the optimal implementation.
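To make the IS2 idea concrete, the sketch below implements the two nested layers on a toy latent-variable model: an inner importance sampler produces an unbiased estimate of the likelihood for each parameter draw, and an outer importance sampler uses that estimate in place of the exact likelihood when forming the posterior weights. The model, proposal choices, and function names (e.g. loglik_hat) are illustrative assumptions, not the specific models or tuning used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy latent-variable model (illustrative only):
#   latent   x_j | theta ~ N(theta, 1),   observed   y_j | x_j ~ N(x_j, 1).
# The exact likelihood is tractable here, but IS2 only requires the
# unbiased importance-sampling estimate computed in loglik_hat below.
y = rng.normal(loc=0.5, scale=np.sqrt(2.0), size=50)

def loglik_hat(theta, n_inner=100):
    """Log of an unbiased importance-sampling estimate of p(y | theta)."""
    logp = 0.0
    for yj in y:
        # Inner proposal for the latent x_j: its prior N(theta, 1).
        x = rng.normal(theta, 1.0, size=n_inner)
        # Weights p(y_j | x_j); their average is an unbiased estimate of p(y_j | theta).
        logw = -0.5 * np.log(2 * np.pi) - 0.5 * (yj - x) ** 2
        m = logw.max()
        logp += np.log(np.mean(np.exp(logw - m))) + m
    return logp

def log_prior(theta):
    # theta ~ N(0, 10) prior (illustrative).
    return -0.5 * np.log(2 * np.pi * 10.0) - 0.5 * theta**2 / 10.0

# Outer importance sampling over theta with a Gaussian proposal g(theta).
M = 2000
g_mean, g_sd = 0.0, 2.0
thetas = rng.normal(g_mean, g_sd, size=M)
log_g = -0.5 * np.log(2 * np.pi * g_sd**2) - 0.5 * ((thetas - g_mean) / g_sd) ** 2

# IS2 weights: the estimated likelihood replaces the exact one.
log_w = np.array([loglik_hat(t) + log_prior(t) for t in thetas]) - log_g
w = np.exp(log_w - log_w.max())
w /= w.sum()

print("posterior mean of theta ~", np.sum(w * thetas))
print("effective sample size   ~", 1.0 / np.sum(w**2))
```

In this sketch, n_inner plays the role of the precision of the likelihood estimate discussed in the abstract: larger values reduce the variance of loglik_hat and hence of the outer weights, at proportionally higher computational cost, which is the tradeoff the paper's guidelines address.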